Augmenting Clause Learning with Implied Literals
Authors
Abstract
There exist various approaches in SAT solvers that aim at extending inference based on unit propagation. For instance, probing [5] simply applies unit propagation of literals at the root node in order to detect failed literals [3] or to populate literal implication lists. The latter information can then, for instance, be used to shrink clauses by hidden literal elimination (e.g., if a → b, then (a ∨ b ∨ c) can be reduced to (b ∨ c); cf. [4]). Here we propose to strengthen clause learning by dynamically inferring literals that a newly learned clause entails. We say that a literal l is an implied literal for a clause C if all literals of C entail l. For instance, if a → d, ¬b → d, and c → d, then (a ∨ ¬b ∨ c) entails d. While this insight has already been exploited in several methods (e.g., variations of hyper binary resolution and hidden literal elimination), we apply it to clause learning: when the SAT solver derives a new conflict clause c, we check whether the literals in c jointly imply one or more literals, which can then be propagated as new unit literals.

In order to employ this technique, we first need to generate implication lists L(l) = UnitPropagation(l) for each literal l. This is done at the root node of the search tree before the solving process starts and then periodically during search. During this computation, we also add any not yet existing binary clauses corresponding to ∀l ∈ L(p) : ¬l → ¬p. As one might expect, we detect failed literals as well and add and propagate their negations as new unit literals; we do the same for all literals in the intersection of L(p) and L(¬p) for every variable p occurring in F. Once the implication lists have been computed for each literal, we iterate over the clauses in the original theory F and propagate all implied literals as new unit clauses.

Ideally, we would like to augment these lists with new implications whenever new clauses have been learned by the solver. However, this operation can be computationally expensive, and we must control how frequently it is performed. While learned clauses are usually 'forgotten' over time, we hold on to the implication lists and only extend them when possible. Note that this can enable inference that might not be explicitly captured in the current clausal theory. Whenever …
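The following is a minimal Python sketch of the two steps described above, not the authors' implementation: it builds the lists L(l) by unit propagation from each literal at the root node and then checks whether all literals of a (learned) clause imply a common literal, which could be propagated as a new unit. The encoding of literals as signed integers (-x denotes the negation of x) and the names unit_propagate, build_implication_lists, and implied_literals_of_clause are assumptions made for this example.

from collections import defaultdict

def unit_propagate(clauses, assumption):
    # Return the set of literals forced by assuming `assumption`, i.e. L(assumption),
    # or None if propagation reaches a conflict (i.e. `assumption` is a failed literal).
    assigned = {assumption}
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(lit in assigned for lit in clause):
                continue                      # clause already satisfied
            open_lits = [lit for lit in clause if -lit not in assigned]
            if not open_lits:
                return None                   # every literal falsified: conflict
            if len(open_lits) == 1:           # unit: the remaining literal is forced
                assigned.add(open_lits[0])
                changed = True
    assigned.discard(assumption)
    return assigned

def build_implication_lists(clauses, variables):
    # Root-node probing: L(l) = UnitPropagation(l) for every literal l.
    L = defaultdict(set)
    for v in variables:
        for lit in (v, -v):
            implied = unit_propagate(clauses, lit)
            L[lit] = implied if implied is not None else set()
    return L

def implied_literals_of_clause(clause, L):
    # Literals entailed by the clause: those implied by *every* literal in it
    # (each literal trivially implies itself).
    common = set(L[clause[0]]) | {clause[0]}
    for lit in clause[1:]:
        common &= set(L[lit]) | {lit}
    return common

if __name__ == "__main__":
    # Example from the text with a, b, c, d encoded as variables 1..4:
    # the binary clauses encode a -> d, not-b -> d, c -> d, so the (learned)
    # clause (a v not-b v c) = [1, -2, 3] entails d = 4.
    clauses = [[-1, 4], [2, 4], [-3, 4]]
    L = build_implication_lists(clauses, variables=[1, 2, 3, 4])
    print(implied_literals_of_clause([1, -2, 3], L))   # prints {4}

A full implementation would also, as described above, assert the negation of each failed literal (where unit_propagate returns None) as a new unit, do the same for any literal in L(p) ∩ L(¬p), and add the missing binary clauses ¬l → ¬p for l ∈ L(p); these steps are omitted here for brevity.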
Similar Papers
Extending Clause Learning SAT Solvers with Complete Parity Reasoning (extended version)
Instances of logical cryptanalysis, circuit verification, and bounded model checking can often be succinctly represented as a combined satisfiability (SAT) problem where an instance is a combination of traditional clauses and parity constraints. This paper studies how such combined problems can be efficiently solved by augmenting a modern SAT solver with an xor-reasoning module in the DPLL(XOR)...
Satisfiability Modulo Theories
DPLL: During search, DPLL states are pairs M || F, where M is a truth assignment and F is a set of clauses (problem clauses + learned clauses). The truth assignment is a list of literals: either decision literals (guesses) or implied literals (by unit propagation). If literal l is implied by unit propagation from clause C ∨ l, then the clause is recorded as the explanation for l. This is written l_{C∨l} i...
Contributions to the Theory of Practical Quantified Boolean Formula Solving
Recent solvers for quantified boolean formulas (QBFs) use a clause learning method based on a procedure proposed by Giunchiglia et al. (JAIR 2006), which avoids creating tautological clauses. The underlying proof system is Q-resolution. This paper shows an exponential worst case for the clause-learning procedure. This finding confirms empirical observations that some formulas take mysteriously ...
Learning Weakly Acyclic Horn Programs
We consider a general class of "weakly acyclic Horn programs" where the literals implied by the examples and the target clauses form an acyclic dependency graph. A Horn clause is transparent if all the terms in all its derivations from the target program are contained in the clause itself. A Horn program is transparent if all its clauses are transparent. We show that any subclass of first-order w...
Conflict-Driven XOR-Clause Learning (extended version)
Modern conflict-driven clause learning (CDCL) SAT solvers are very good at solving conjunctive normal form (CNF) formulas. However, some application problems involve lots of parity (xor) constraints which are not necessarily handled efficiently if translated into CNF. This paper studies solving CNF formulas augmented with xor-clauses in the DPLL(XOR) framework where a CDCL SAT solver is coupled...